AI-generated code and vibe coding: copyright, licensing, and legal risks

Keywords: Vibe coding, artificial intelligence, intellectual property, GitHub Copilot, AI-generated code, AI Act, open source.

Legal analysis by Matthieu Quiniou, Partner IP/IT Lawyer at D&A Partners

Vibe coding refers to the use of generative artificial intelligence tools to produce computer code from natural language instructions.

AI code generation tools such as GitHub Copilot, ChatGPT or Claude now make it possible to rapidly generate functional code. Their use raises significant legal issues relating to the intellectual property of the generated code, open-source licensing, liability in the event of bugs, and compliance with the European Artificial Intelligence Act.

This guide answers the main legal questions surrounding vibe coding and AI-generated code.

Key takeaways

  • AI-generated code may be protected by copyright if creative human input can be demonstrated.
  • The use of vibe coding may expose developers to open-source license contamination risks (such as GNU General Public License or GNU Affero General Public License).
  • Liability for software generally remains with the company that deploys the software into production, even when AI tools were used during development.
  • Companies should implement code and license audits before any production deployment.

1. Understanding vibe coding

What is vibe coding and how does it work?

Vibe coding is a programming practice made possible by generative artificial intelligence models, which allows computer code to be created from instructions formulated as prompts.

Large language models (LLMs) are trained on large volumes of data and digital content; their training corpora may include code from public repositories such as GitHub or GitLab, as well as other data sources.

Although vibe coding can be used by experienced developers as a programming assistance tool, this practice also helps to democratize access to software development. It allows people with little or no programming knowledge to generate code from instructions formulated in natural language.

2. Training AI models used for vibe coding

Can you object to your code being used to train AI models?

Theoretically, yes, but in practice it’s more complicated.

The European AI Regulation (EU) 2024/1689 of June 13, 2024 refers to the Copyright Directive 2019/790 of April 17, 2019, which provides in Articles 3 and 4 for an exception to copyright for text and data mining (known as the “TDM exception”).

This exception allows the reproduction and extraction of lawfully accessible works for the purposes of text and data mining, except where rights holders have expressly objected in a machine-readable form, for example through mechanisms such as a robots.txt file.
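As an illustration, a TDM opt-out can be expressed in a site's robots.txt file by disallowing crawlers used to collect training data. The user-agent names below (GPTBot, Google-Extended, CCBot) are those published by OpenAI, Google, and Common Crawl at the time of writing; the list is illustrative, not exhaustive.

```
# Illustrative opt-out from AI training crawlers (non-exhaustive list)
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /
```

Such directives rely entirely on crawler cooperation and carry no technical enforcement.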

In practice, the effectiveness of this right to object remains limited. Opt-out mechanisms are still imperfectly standardized, and it is often difficult to verify whether these reservations are actually respected when training model databases are created. This difficulty also exists for computer code. License files, readme files, or comments in repositories are rarely taken into account during automated data collection and the creation of AI model training databases.

The European AI Regulation, through the work of the AI Office, provides for certain transparency obligations for providers of general-purpose AI models, including the obligation to implement a copyright compliance policy and to document the training data used. However, the technical opacity of model training systems, often described as black boxes and covered by trade secrets, does not in practice allow rights holders to verify that their opposition to training on their creations has been respected.

Are developers compensated when their code is used to train AI?

This is still quite rare, but the issue is being debated, as code is, under certain conditions, eligible for copyright protection, raising the question of value sharing or collective remuneration. Several lawsuits have already been filed concerning the use of open source code to train AI systems without attribution to the original developers, notably in the GitHub Copilot case (J. DOE 1 v. GitHub Inc., Northern District of California, Case 3:22-cv-06823, Nov. 3, 2022).

3. Intellectual property of AI-generated code

Is computer code protected by copyright?

Computer code can indeed be protected by copyright; the essential criterion for protection is originality.

It is settled case law (Court of Cassation, Plenary Assembly, March 7, 1986, 83-10.477, Babolat case) that originality in computer code is assessed on the basis of the mark of intellectual contribution characterized by the fact that the author of the code has “demonstrated a personalized effort going beyond the simple implementation of an automatic and restrictive logic.”

Article L 112-2 (13°) of the Intellectual Property Code (CPI) explicitly states that software is a work of the mind.

Is code generated with vibe coding protectable?

In the absence of specific case law on the subject, it is difficult to give a definitive opinion on the protection of code generated with vibe coding. It nevertheless seems reasonable to consider that protection depends mainly on the degree of human intervention in the creation process.

In copyright law, only a creation that reflects the author’s own intellectual contribution can be protected. If the developer uses an AI tool to design the program architecture, formulate precise instructions, and then select, modify, and integrate the generated code, the result should be considered an original work eligible for copyright protection.

Conversely, if the code has been generated in a largely automated manner by an AI system without significant human intervention, protection is more uncertain.

In practice, vibe coding is most often part of a co-creation process between the developer and the AI tool, which leads to the originality being assessed in terms of the choices and decisions made by the developer in the design and structuring of the software.

In summary, using an AI system to generate code does not in itself exclude copyright protection; it does, however, shift the originality analysis toward the creative choices made by the developer.

Who owns the code created with vibe coding?

Code created with vibe coding belongs in principle to its author, provided that it constitutes a work of the mind that can be protected by copyright. In French law, as in most legal systems, the rights to software belong to the person who made the intellectual contribution that gave rise to the code.

When vibe coding is used as a programming assistance tool, the author will therefore generally be the developer who designs the program architecture, formulates the instructions, and selects or modifies the generated code.

However, two important factors must be taken into account. Firstly, under French law, according to Article L113-9 of the CPI, the economic rights to software created by employees in the course of their duties are transferred to the employer. Secondly, the user licenses or general terms and conditions of use for the AI tools used for vibe coding may include certain rules concerning the use or reuse of the generated code.

It is therefore recommended that these contractual terms and conditions, as well as the framework of the employment or service relationship, be carefully reviewed.

How can human intervention in AI-generated code be proven?

In copyright law, the protection of software presupposes the existence of human intellectual input that characterizes the originality of the work. When code is generated with the help of an AI system, it may therefore be useful to document the developer’s intervention in order to demonstrate that the code is indeed the result of human creative choices.

Documenting the developer’s intervention is an important step in facilitating the recognition of copyright ownership of the generated code.

Several elements can help establish this intervention, for example:

  • keeping prompts and exchanges with the AI tool;
  • successive versions of the code (Git history, commits, modifications);
  • documentation of the software architecture and technical choices made by the developer;
  • traces of editing, integration, and adaptation of the generated code.

In this context, implementing best practices for traceability in the development process becomes an important issue in securing ownership rights to software developed with the help of artificial intelligence tools.
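The traceability measures above can be partly automated. The sketch below is a minimal, hypothetical illustration (the function name and log format are invented for this example): it appends one JSON line per AI-assisted change, recording the prompt, hashes of the raw AI output and of the code actually kept, and whether the developer modified the output.

```python
import hashlib
import json
import time

def record_ai_contribution(prompt, ai_output, final_code, log_path="ai_provenance.jsonl"):
    """Append a JSON line documenting one AI-assisted change.

    Keeps the prompt, hashes of the raw AI output and of the code actually
    committed, and a flag showing whether the developer modified the output.
    """
    entry = {
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "prompt": prompt,
        "ai_output_sha256": hashlib.sha256(ai_output.encode()).hexdigest(),
        "final_sha256": hashlib.sha256(final_code.encode()).hexdigest(),
        "human_modified": ai_output != final_code,
    }
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
    return entry
```

Stored alongside the Git history, such a log helps show that the final code reflects the developer's own choices rather than raw model output.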

Can AI-generated code violate open source licenses?

Yes, this risk exists and is currently the subject of much legal debate.

The AI systems used for vibe coding are trained on vast corpora of computer code, including open source repositories, for example under the GNU GPL license. In some cases, the generated code may reproduce or be inspired by existing code fragments. If these code snippets come from projects subject to copyleft licenses, their integration into software may create certain contractual obligations, including the obligation to publish the source code under the same license as the original code. These licenses are often referred to as contaminating licenses.

Two legal interpretations are currently being discussed.

The first, and most widespread, considers that the contaminating effect only applies if the generated code actually reproduces identifiable fragments of code subject to a copyleft license. In this case, it is recommended that the generated code be audited, similar to a plagiarism check, in order to detect any matches with open source repositories.

A second, more extensive and currently marginal interpretation is that once an AI model has been trained on code subject to copyleft licenses, the generated code should itself be subject to these licenses. Such an approach would have significant consequences, as it would call into question the possibility of protecting or exploiting AI-generated code in a proprietary manner.

The use of AI-generated code may therefore expose companies to constraints related to open source licenses or the unintentional introduction of problematic dependencies.

What license should be adopted for code developed with the help of AI?

The choice of license for code developed with the help of an artificial intelligence tool depends above all on the strategy of the software project and the legal framework applicable to the generated code. If the code is copyrightable and the developer or company is the copyright holder, it can be distributed under either a proprietary license or a fully or partially open source license (MIT, Apache 2.0, GPL, etc.).

In practice, it is in companies’ interests to implement procedures for auditing the generated code and verifying licenses, similar to those used for managing open source dependencies in traditional software projects.
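Part of such an audit can be automated. The sketch below is a hypothetical illustration of the principle, not a production tool: it fingerprints sliding windows of normalized lines and measures how much of the generated code matches a known reference file. Real audits rely on dedicated scanners and large snippet databases.

```python
import hashlib
import re

def fingerprint(code, window=5):
    """Return a set of hashes over sliding windows of normalized, non-empty lines."""
    lines = [re.sub(r"\s+", " ", line).strip() for line in code.splitlines()]
    lines = [line for line in lines if line]
    return {
        hashlib.sha1(" ".join(lines[i:i + window]).encode()).hexdigest()
        for i in range(max(1, len(lines) - window + 1))
    }

def overlap_ratio(generated_code, reference_code):
    """Share of the generated code's windows that also appear in the reference code."""
    gen, ref = fingerprint(generated_code), fingerprint(reference_code)
    return len(gen & ref) / len(gen) if gen else 0.0
```

A high overlap ratio against a copyleft-licensed repository would be a signal to escalate the fragment for legal review before release.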

Can the prompts used to generate code be protected by copyright?

Yes. If the prompts are original, there is no reason why they cannot be protected by copyright as works of the mind.

4. Legal risks of vibe coding

Who is liable in the event of a bug or flaw in AI-generated code?

Liability lies with the person or company that develops, integrates, or makes the software available to users or the public. The use of an artificial intelligence tool to generate code does not transfer liability to the AI provider.

Furthermore, AI code generation systems often include clauses in their terms and conditions of use stating that no guarantee is provided regarding the output.

It is therefore up to developers and companies to carry out the necessary tests, security audits, and code reviews before putting anything into production. Given the current state of the art in technology, it seems inappropriate to require AI systems to guarantee that the generated code is free of bugs or vulnerabilities. AI is a development aid tool, but the ultimate responsibility for software quality and security remains with humans.

What are the risks of confidentiality or information leaks with vibe coding tools?

The use of AI tools to generate code may present confidentiality risks, particularly when developers transmit sensitive code elements or technical information to the system.

These risks are particularly significant when the AI tool is operated via an online service and not deployed locally. Prompts, code snippets, or architecture descriptions submitted to the system may be processed on third-party servers and, depending on the terms of use of the service, may be stored, analyzed, or used to improve the models.

In this context, there is a risk of disclosure of information covered by trade secrets, particularly when a developer submits proprietary code, internal algorithms, or sensitive software architecture elements.

To limit these risks, companies can, in particular:

  • regulate the use of AI tools through internal policies,
  • avoid submitting confidential or strategic code,
  • favor solutions deployed locally or in secure environments,
  • verify the contractual terms and conditions and data processing policies of AI providers.

The use of vibe coding tools must therefore be compatible with trade secret protection obligations and, where applicable, with the company’s internal information security policies.
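The first two measures above can be supported by a simple technical guardrail. The sketch below is a hypothetical pre-submission check; the patterns are illustrative examples only, not a complete confidentiality policy, and a real deployment would maintain them under the company's internal security rules.

```python
import re

# Illustrative patterns only; a real policy would extend and maintain these.
CONFIDENTIAL_PATTERNS = [
    re.compile(r"(?i)\bapi[_-]?key\b"),
    re.compile(r"(?i)\bpassword\s*="),
    re.compile(r"(?i)\bconfidential\b|\binternal\s+only\b"),
    re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
]

def check_prompt(prompt):
    """Return the patterns a prompt matches; an empty list means it may be sent."""
    return [p.pattern for p in CONFIDENTIAL_PATTERNS if p.search(prompt)]
```

Blocking or flagging prompts that match such patterns before they leave the company's environment reduces, without eliminating, the risk of disclosing trade secrets to a third-party service.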

Can code generated with vibe coding be reused by AI providers to train their models?

There is no absolute answer to this question, as it is generally governed by the licenses and terms of use of generative AI systems. Some AI systems allow users to choose whether their prompts and generated content can be used to improve the model.

5. Best practices for vibe coding in companies

Can AI-generated code be used in commercial software?

In most cases, AI-generated code can be used in commercial software. However, several legal precautions should be taken: in particular, verifying that the generated code does not reproduce fragments subject to restrictive open source licenses, and that the terms of use of the AI tool allow commercial exploitation of the generated code.

What best practices should be adopted before using vibe coding, and before publishing or deploying AI-generated code?

Before going live, it is recommended to:

  • avoid disclosing confidential information or proprietary code when prompting;
  • check the terms of use of the AI tool and the rules applicable to the generated outputs;
  • conduct a human review of the code and thorough technical testing;
  • verify the absence of security vulnerabilities and the robustness of the software;
  • perform a license and similarity audit to detect any fragments from software subject to restrictive open source licenses;
  • document human intervention in the development process (prompts, modifications, Git history) to secure ownership of rights.

In general, code generated with the help of AI should be considered as code to be verified and audited, rather than code that is ready to be used without control.

Companies using vibe coding tools must therefore integrate these legal issues into their software development practices, particularly with regard to intellectual property, open source licenses, and risk management.

Legal support

D&A Partners advises companies, startups, and developers on legal issues related to artificial intelligence, software intellectual property, and compliance with the European AI regulatory framework.

Last updated: March 2026.

By Matthieu Quiniou – Partner, Lawyer